
    Validation of Stereophotogrammetry of the Human Torso

    The objective of this study was to determine whether measurements of breast morphology computed from three-dimensional (3D) stereophotogrammetry are equivalent to traditional anthropometric measurements obtained directly on a subject with a tape measure. 3D torso images of 23 women, ranging in age from 36 to 63 years, who had undergone or were scheduled for breast reconstruction surgery were obtained using a 3dMD torso system (3Q Technologies Inc., Atlanta, GA). A total of nine distances of two types (contoured and line-of-sight) were computed from the 3D images of each participant. Each participant was photographed twice: first without fiducial points marked (referred to as the unmarked image) and second with fiducial points marked prior to imaging (referred to as the marked image). Stereophotogrammetry was compared to traditional direct anthropometry, in which measurements were taken on the participants with a tape measure. Three statistical analyses were used to evaluate the agreement between stereophotogrammetry and direct anthropometry. Seven of the nine distances showed excellent agreement between stereophotogrammetry and direct anthropometry (for both marked and unmarked images). In addition, stereophotogrammetry from the unmarked image was equivalent to that from the marked image (for both line-of-sight and contoured distances). A lower level of agreement was observed for some measures because of difficulty in localizing more vaguely defined fiducial points, such as the lowest visible point of the breast mound, and because of the imaging system's inability to capture areas obscured by the breast, such as the inframammary fold. Stereophotogrammetry from 3D images obtained with the 3dMD torso system is effective for quantifying breast morphology. Tools for surgical planning and evaluation based on stereophotogrammetry have the potential to improve breast surgery outcomes.
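    The contrast between the two distance types can be sketched in a few lines: a line-of-sight distance is the straight Euclidean separation of two fiducial points, while a contoured distance follows the torso surface and therefore can never be shorter. The fiducial coordinates and surface samples below are hypothetical, and summing polyline segments is only a rough stand-in for a true surface trace on the 3D mesh.

```python
import numpy as np

def line_of_sight(p, q):
    """Straight-line (Euclidean) distance between two fiducial points."""
    return float(np.linalg.norm(np.asarray(q, float) - np.asarray(p, float)))

def contoured(path):
    """Contoured distance: cumulative length of a polyline sampled
    along the torso surface between two fiducial points."""
    pts = np.asarray(path, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

# Hypothetical fiducial points in mm (e.g., sternal notch to nipple).
sn = (0.0, 0.0, 0.0)
ni = (60.0, -180.0, 40.0)
# A few intermediate surface samples; a real system would trace the mesh.
surface_path = [sn, (20.0, -60.0, 30.0), (40.0, -120.0, 45.0), ni]

d_los = line_of_sight(sn, ni)
d_con = contoured(surface_path)
assert d_con >= d_los  # the surface path can never be shorter
```

For a flat surface the two distances coincide; the gap between them grows with breast projection, which is one reason the two types are reported separately.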

    3D Symmetry Measure Invariant to Subject Pose During Image Acquisition

    In this study we evaluate the influence of subject pose during image acquisition on quantitative analysis of breast morphology. Three-dimensional (3D) and two-dimensional (2D) images of the torsos of 12 female subjects in two different poses, (1) hands-on-hip (HH) and (2) hands-down (HD), were obtained. To quantify the effect of pose, we introduce a new measure, the 3D pBRA (percentage Breast Retraction Assessment) index, and validate its use against the 2D pBRA index. Our data suggest that the 3D pBRA index is linearly correlated with its 2D counterpart for both poses, and is independent of the localization of fiducial points within a tolerance limit of 7 mm. The quantitative assessment of 3D asymmetry was found to be invariant to subject pose. This study further corroborates the advantages of 3D stereophotogrammetry over 2D photography: problems with pose that are inherent in 2D photographs are avoided, and fiducial point identification is made easier by the ability to rotate the 3D surface panoramically, enabling views from any desired angle.
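    The abstract does not spell out the pBRA formula; the sketch below is one plausible reading based on the classic Breast Retraction Assessment, in which nipple positions are expressed relative to the sternal notch and one side is mirrored across the midline so that perfect symmetry scores zero. The normalization by the notch-to-reference-nipple distance is an assumption, and all coordinates are hypothetical. Because the function accepts 2D or 3D points, the same code illustrates both indices.

```python
import numpy as np

def pbra_index(notch, nipple_a, nipple_b):
    """Illustrative pBRA sketch (the abstract does not give the formula).

    Coordinates are taken relative to the sternal notch; nipple_b is
    mirrored across the mid-sagittal (x = 0) plane, so a perfectly
    symmetric pair yields 0. The residual distance is reported as a
    percentage of the notch-to-reference-nipple distance (assumed
    normalization). Accepts 2D (x, y) or 3D (x, y, z) points.
    """
    notch = np.asarray(notch, float)
    a = np.asarray(nipple_a, float) - notch
    b = np.asarray(nipple_b, float) - notch
    b_mirrored = b * np.array([-1.0] + [1.0] * (len(b) - 1))
    return float(100.0 * np.linalg.norm(a - b_mirrored) / np.linalg.norm(b))

# Hypothetical fiducial points in mm, relative to the sternal notch.
notch = np.zeros(3)
left = np.array([95.0, -190.0, 60.0])
right = np.array([-90.0, -185.0, 58.0])
asymmetry = pbra_index(notch, left, right)  # small positive percentage
```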

    Automated Identification of Fiducial Points on 3D Torso Images

    Breast reconstruction is an important part of the breast cancer treatment process for many women. Recently, 2D and 3D images have been used by plastic surgeons for evaluating surgical outcomes. Distances between different fiducial points are frequently used as quantitative measures for characterizing breast morphology. Fiducial points can be marked directly on subjects for direct anthropometry, or marked manually on images. This paper introduces novel algorithms to automate the identification of fiducial points in 3D images. Automating the process will make measurements of breast morphology more reliable, reducing inter- and intra-observer bias. Algorithms to identify three fiducial points, the nipples, sternal notch, and umbilicus, are described. The algorithms used for localization of these fiducial points combine surface curvature and 2D color information. Algorithm performance was validated by comparing the 3D coordinates of automatically detected fiducial points with those identified manually, and by comparing the geodesic distances between fiducial points. The algorithms reliably identified the location of all three fiducial points. We dedicate this article to our late colleague and friend, Dr. Elisabeth K. Beahm. Elisabeth was both a talented plastic surgeon and physician-scientist; we deeply miss her insight and her fellowship.
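    As a toy illustration of fusing surface curvature with 2D color information, the sketch below linearly blends two normalized cue maps and takes the strongest joint response. The map names, the weights, and the linear combination are assumptions for illustration only; the published algorithms are more elaborate than this.

```python
import numpy as np

def detect_fiducial(curvature, redness, w_curv=0.5, w_color=0.5):
    """Toy fiducial detector: blend a surface-curvature map with a 2D
    color cue (both HxW arrays normalized to [0, 1]) and return the
    (row, col) of the strongest combined response. The weights and the
    linear blend are assumptions, not the paper's method.
    """
    score = w_curv * np.asarray(curvature) + w_color * np.asarray(redness)
    idx = np.unravel_index(np.argmax(score), score.shape)
    return tuple(int(i) for i in idx)

# Synthetic 5x5 cue maps with a joint peak at (2, 3).
curv = np.zeros((5, 5)); curv[2, 3] = 1.0
red = np.zeros((5, 5)); red[2, 3] = 0.8
assert detect_fiducial(curv, red) == (2, 3)
```

Using two independent cues makes the detector more robust than either alone: a curvature extremum without the matching color response (or vice versa) scores lower than a location where both agree.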